Sparse regression with multi-type regularized feature modeling


Abstract

Within the statistical and machine learning literature, regularization techniques are often used to construct sparse (predictive) models. Most strategies are designed for data where all predictors are treated identically, such as Lasso regression for (continuous) predictors with linear effects. However, many predictive problems involve different types of predictors, each requiring a tailored regularization term. We propose a multi-type penalty that acts on the objective function as a sum of subpenalties, one for each type of predictor. As such, we allow for predictor selection and level fusion within a predictor in a data-driven way, simultaneous with the parameter estimation process. We develop a new estimation strategy for convex predictive models with this multi-type penalty. Using the theory of proximal operators, our estimation procedure is computationally efficient, partitioning the overall optimization problem into easier-to-solve subproblems, each specific to a predictor type and its associated subpenalty. Earlier research applies approximations to non-differentiable penalties to solve the optimization problem. The proposed SMuRF algorithm removes the need for approximations and achieves a higher accuracy and computational efficiency. This is demonstrated with an extensive simulation study and the analysis of a case-study on insurance pricing analytics.
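The core idea of the abstract — a separable multi-type penalty whose proximal operator splits into one easy subproblem per predictor type — can be illustrated with a minimal proximal-gradient sketch. This is an assumption-laden illustration, not the paper's actual SMuRF implementation: the function names (`smurf_like_fit`, `prox_lasso`, `prox_group_lasso`) are hypothetical, a squared-error loss stands in for a general convex model, and only Lasso and group Lasso subpenalties are shown (the paper also covers, e.g., fused penalties for level fusion).

```python
import numpy as np

def prox_lasso(v, t):
    # Soft-thresholding: proximal operator of t * ||v||_1 (predictor selection).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_group_lasso(v, t):
    # Block soft-thresholding: proximal operator of t * ||v||_2
    # (selects or drops a group of coefficients as a whole).
    norm = np.linalg.norm(v)
    return np.zeros_like(v) if norm <= t else (1.0 - t / norm) * v

def smurf_like_fit(X, y, groups, proxes, lam=0.1, n_iter=500):
    """Proximal-gradient sketch for (1/2)||y - X beta||^2 plus a
    multi-type penalty that is a sum of subpenalties, one per group.

    groups : list of index arrays, one per predictor type
    proxes : list of proximal operators, matching `groups`
    """
    p = X.shape[1]
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        z = beta - step * grad
        # Because the penalty is separable over predictor types, its prox
        # splits into independent, cheap subproblems — one per group.
        for idx, prox in zip(groups, proxes):
            beta[idx] = prox(z[idx], step * lam)
    return beta
```

The splitting in the inner loop is exactly what makes this family of methods efficient: each subpenalty's proximal operator has a closed form (or a fast dedicated solver), so the coupled optimization reduces to cheap per-type updates around a shared gradient step.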


Related articles

Sparse regularized local regression

The intention is to provide a Bayesian formulation of regularized local linear regression, combined with techniques for optimal bandwidth selection. This approach arises from the idea that only those covariates that are found to be relevant for the regression function should be considered by the kernel function used to define the neighborhood of the point of interest. However, the regression fu...


Sparse multi-stage regularized feature learning for robust face recognition

The major limitation in current facial recognition systems is that they do not perform very well in uncontrolled environments, that is, when faces present variations in pose, illumination, facial expressions and environment. This is a serious obstacle in applications such as law enforcement and surveillance systems. To address this limitation, in this paper we introduce an improved approach to ...


Regularized Sparse Kernel Slow Feature Analysis

This paper develops a kernelized slow feature analysis (SFA) algorithm. SFA is an unsupervised learning method to extract features which encode latent variables from time series. Generative relationships are usually complex, and current algorithms are either not powerful enough or tend to over-fit. We make use of the kernel trick in combination with sparsification to provide a powerful function...


Feature Selection via Block-Regularized Regression

Identifying co-varying causal elements in very high dimensional feature space with internal structures, e.g., a space with as many as millions of linearly ordered features, as one typically encounters in problems such as whole genome association (WGA) mapping, remains an open problem in statistical learning. We propose a block-regularized regression model for sparse variable selection in a high...


Learning Feature Nonlinearities with Non-Convex Regularized Binned Regression

For various applications, the relations between the dependent and independent variables are highly nonlinear. Consequently, for large scale complex problems, neural networks and regression trees are commonly preferred over linear models such as Lasso. This work proposes learning the feature nonlinearities by binning feature values and finding the best fit in each quantile using non...



Journal

Journal: Insurance Mathematics & Economics

Year: 2021

ISSN: 0167-6687, 1873-5959

DOI: https://doi.org/10.1016/j.insmatheco.2020.11.010